Combining SMOTE and OVA with Deep Learning and Ensemble Classifiers for Multiclass Imbalanced

Authors

Abstract

The classification of real-world problems often involves imbalanced and multiclass datasets. A dataset with unbalanced and multiple classes affects the learned pattern model and decreases accuracy. Hence, an oversampling method keeps the classes balanced and avoids the overfitting problem. The purposes of this study were to handle multiclass imbalanced datasets and improve the effectiveness of the model. This study proposed a hybrid method that combines the Synthetic Minority Oversampling Technique (SMOTE) and One-Versus-All (OVA) with deep learning and ensemble classifiers, the stacking and random forest algorithms, for data handling. Datasets with different numbers of imbalanced classes were obtained from the UCI Machine Learning Repository. The research outputs illustrated that the presented method acquired the best accuracy value of 98.51% on the new-thyroid dataset when the deep learning classifier was used to evaluate model performance. The stacking algorithm received a higher accuracy rate than the other methods on the car, pageblocks, and Ecoli datasets. In addition, the highest classification accuracy of 98.47% was reached on the dermatology dataset, where random forest was used as the classifier.
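The two building blocks named in the abstract can be sketched in plain numpy: SMOTE generates synthetic minority samples by interpolating between nearest minority neighbours, and OVA decomposes the multiclass problem into one scorer per class. This is an illustrative sketch only; the `smote` and `ova_fit_predict` helpers and the toy centroid scorer are assumptions, stand-ins for the paper's deep learning, stacking, and random forest classifiers, not the authors' implementation.

```python
import numpy as np

def smote(X_min, n_new, k=5, seed=0):
    """SMOTE-style oversampling: interpolate between a random minority
    point and one of its k nearest minority-class neighbours."""
    rng = np.random.default_rng(seed)
    n = len(X_min)
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude self-matches
    kk = min(k, n - 1)
    nn = np.argsort(d, axis=1)[:, :kk]   # k nearest neighbours per point
    base = rng.integers(0, n, n_new)
    neigh = nn[base, rng.integers(0, kk, n_new)]
    gap = rng.random((n_new, 1))         # interpolation factor in [0, 1)
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

def ova_fit_predict(X_by_class, X_test):
    """One-Versus-All decomposition: score each class with its own
    binary scorer (here a toy centroid distance) and take the argmax."""
    classes = sorted(X_by_class)
    cents = np.stack([X_by_class[c].mean(axis=0) for c in classes])
    scores = -np.linalg.norm(X_test[:, None, :] - cents[None, :, :], axis=-1)
    return np.array(classes)[scores.argmax(axis=1)]

# Toy 3-class imbalanced dataset; oversample minorities up to the majority size.
rng = np.random.default_rng(42)
X_by_class = {0: rng.normal(0, 1, (100, 2)),
              1: rng.normal(5, 1, (20, 2)),
              2: rng.normal(-5, 1, (8, 2))}
target = max(len(v) for v in X_by_class.values())
balanced = {c: (np.vstack([Xc, smote(Xc, target - len(Xc))])
                if len(Xc) < target else Xc)
            for c, Xc in X_by_class.items()}
pred = ova_fit_predict(balanced, np.array([[0.0, 0.0], [5.0, 5.0], [-5.0, -5.0]]))
```

After oversampling, every class contributes the same number of training samples to each binary OVA subproblem, which is the balance property the paper's pipeline relies on before fitting its classifiers.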


Similar Articles

Managing Borderline and Noisy Examples in Imbalanced Classification by Combining SMOTE with Ensemble Filtering

Imbalanced data constitutes a great difficulty for most algorithms that learn classifiers. However, as recent works claim, class imbalance is not a problem in itself, and performance degradation is also associated with other factors related to the distribution of the data, such as the presence of noisy and borderline examples in the areas surrounding class boundaries. This contribution proposes to extend...


Co-Multistage of Multiple Classifiers for Imbalanced Multiclass Learning

In this work, we propose two stochastic architectural models (CMC and CMC-M) with two layers of classifiers applicable to datasets with one and multiple skewed classes. This distinction becomes important when the datasets have a large number of classes. Therefore, we present a novel solution to imbalanced multiclass learning with several skewed majority classes, which improves minority classes...


Oversampling for Imbalanced Learning Based on K-Means and SMOTE

Learning from class-imbalanced data continues to be a common and challenging problem in supervised learning as standard classification algorithms are designed to handle balanced class distributions. While different strategies exist to tackle this problem, methods which generate artificial data to achieve a balanced class distribution are more versatile than modifications to the classification a...
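The clustering-based variant summarised above can be sketched in a few lines of numpy: cluster the input space with k-means, then generate synthetic minority samples only inside clusters where the minority class dominates, so interpolation stays in "safe" regions. The helper names, the `min_share` threshold, and the pair-interpolation shortcut are illustrative assumptions, not that paper's exact algorithm.

```python
import numpy as np

def kmeans_labels(X, k, iters=25, seed=0):
    """Plain Lloyd's k-means; returns the cluster label of each row."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        lab = np.linalg.norm(X[:, None] - C[None], axis=-1).argmin(axis=1)
        for j in range(k):
            if (lab == j).any():
                C[j] = X[lab == j].mean(axis=0)
    return lab

def kmeans_smote(X, y, minority, n_new, k=3, min_share=0.5, seed=0):
    """Oversample the minority class only inside clusters where its
    share of points is at least min_share (assumed threshold)."""
    rng = np.random.default_rng(seed)
    lab = kmeans_labels(X, k, seed=seed)
    pools = [np.where((lab == j) & (y == minority))[0]
             for j in range(k)
             if (lab == j).any()
             and (y[lab == j] == minority).mean() >= min_share
             and ((lab == j) & (y == minority)).sum() >= 2]
    # fall back to all minority points if no cluster qualifies
    pool = np.concatenate(pools) if pools else np.where(y == minority)[0]
    a = pool[rng.integers(0, len(pool), n_new)]
    b = pool[rng.integers(0, len(pool), n_new)]
    gap = rng.random((n_new, 1))
    return X[a] + gap * (X[b] - X[a])   # interpolate within safe regions

# Toy data: a 60-point majority blob and a well-separated 10-point minority blob.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (60, 2)), rng.normal(6, 0.5, (10, 2))])
y = np.array([0] * 60 + [1] * 10)
X_syn = kmeans_smote(X, y, minority=1, n_new=50)
```

Because each synthetic point is a convex combination of two minority points, the generated samples stay inside the minority region rather than bleeding into majority territory, which is the noise-avoidance argument behind clustering-based oversampling.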


A Novel Ensemble Method for Imbalanced Data Learning: Bagging of Extrapolation-SMOTE SVM

Class imbalance ubiquitously exists in real life, which has attracted much interest from various domains. Direct learning from an imbalanced dataset may yield unsatisfying results, overfocusing on the accuracy of identification and deriving a suboptimal model. Various methodologies have been developed to tackle this problem, including sampling, cost-sensitive, and other hybrid ones. However, the sa...


Combining Binary Classifiers for a Multiclass Problem with Differential Privacy

The multiclass classification problem is often solved by combining binary classifiers into ensembles. While this is required for inherently binary classifiers, such as SVM, it also provides performance advantages for other classifiers. In this paper, we address the problem of combining binary classifiers into ensembles in the differentially private data publishing framework, where the data privacy is...



Journal

Journal title: Journal of Computer Science

Year: 2022

ISSN: 1552-6607, 1549-3636

DOI: https://doi.org/10.3844/jcssp.2022.732.742